Dropout regularization in hierarchical mixture of experts

Authors

Abstract

Dropout is a very effective method for preventing overfitting and has become the go-to regularizer for multi-layer neural networks in recent years. The hierarchical mixture of experts is a hierarchically gated model that defines a soft decision tree, where leaves correspond to experts and decision nodes correspond to gating models that softly choose between their children; as such, the model defines a soft hierarchical partitioning of the input space. In this work, we propose a variant of dropout that is faithful to the hierarchy defined by the model, as opposed to the flat, unitwise-independent application of dropout used with multi-layer perceptrons. We show, on synthetic regression data and on the MNIST, CIFAR-10, and SSTB datasets, that our proposed dropout mechanism prevents overfitting in trees with many levels, improving generalization and providing smoother fits.
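As a rough illustration of the idea in the abstract, the sketch below evaluates a soft decision tree in which dropout acts on gating nodes of the hierarchy rather than on independent units: a dropped node forwards the input to a single randomly chosen child. All names, the tree layout, and the subtree-dropping rule are assumptions for illustration, not the paper's exact mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # input dimension (illustrative)

def make_tree(depth):
    """Build a random soft decision tree: internal nodes hold gating
    weights, leaves hold linear experts (all parameters illustrative)."""
    if depth == 0:
        return {"w": rng.normal(size=D)}            # leaf expert
    return {"g": rng.normal(size=D),                # gating weights
            "left": make_tree(depth - 1),
            "right": make_tree(depth - 1)}

def predict(node, x, p_drop=0.0, train=False):
    """Recursive evaluation. During training, with probability p_drop a
    gating node is 'dropped': it routes the input to one child chosen by
    a coin flip, so whole subtrees (not individual units) are thinned --
    a hierarchy-faithful form of dropout, sketched under assumptions."""
    if "w" in node:                                  # leaf: linear expert
        return float(node["w"] @ x)
    if train and rng.random() < p_drop:
        child = node["left"] if rng.random() < 0.5 else node["right"]
        return predict(child, x, p_drop, train)
    g = 1.0 / (1.0 + np.exp(-(node["g"] @ x)))       # soft gate in (0, 1)
    return g * predict(node["left"], x, p_drop, train) \
         + (1.0 - g) * predict(node["right"], x, p_drop, train)

tree = make_tree(depth=3)
x = rng.normal(size=D)
y_train = predict(tree, x, p_drop=0.5, train=True)   # stochastic routing
y_test = predict(tree, x)                            # deterministic full tree
```

At test time the full soft tree is evaluated, so repeated predictions on the same input agree; only training-time routing is stochastic.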


Similar articles


Mixture of Experts Classification Using a Hierarchical Mixture Model

A three-level hierarchical mixture model for classification is presented that models the following data generation process: (1) the data are generated by a finite number of sources (clusters), and (2) the generation mechanism of each source assumes the existence of individual internal class-labeled sources (subclusters of the external cluster). The model estimates the posterior probability of c...

Full text

Regularization and Error Bars for the Mixture of Experts Network

Viswanath Ramamurti and Joydeep Ghosh, Department of Electrical and Computer Engineering, The University of Texas at Austin, Austin, TX 78712-1084. E-mail: {viswa,ghosh}@pine.ece.utexas.edu. Abstract: The mixture of experts architecture provides a modular approach to function approximation. Since different experts get attuned to different regions of the input space during the course of training, and ...

Full text

Advances in using hierarchical mixture of experts for signal classification

The hierarchical mixture of experts (HME) architecture is a powerful tree-structured architecture for supervised learning. In this paper, an efficient one-pass algorithm to solve the M-step of the EM iterations while training the HME network to perform classification tasks is first described. This substantially reduces the training time compared to using the IRLS method to solve the M-step. Further,...

Full text

Data Dependent Risk Bounds for Hierarchical Mixture of Experts Classifiers

The hierarchical mixture of experts architecture provides a flexible procedure for implementing classification algorithms. The classification is obtained by a recursive soft partition of the feature space in a data-driven fashion. Such a procedure enables local classification where several experts are used, each of which is assigned with the task of classification over some subspace of the feat...

Full text
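The "recursive soft partition of the feature space" described in the entry above can be sketched as follows: gate probabilities multiply along each root-to-leaf path, so the leaf responsibilities form a valid soft partition (they are non-negative and sum to one). The parameters and the complete-binary-tree layout here are illustrative assumptions, not taken from any of the cited papers.

```python
import numpy as np

def leaf_responsibilities(x, gates):
    """Given per-level sigmoid gate weight matrices for a complete binary
    tree (level l has 2**l nodes), return the probability that x is
    softly routed to each leaf: the product of gate probabilities along
    the root-to-leaf path."""
    probs = np.array([1.0])                          # probability of the root
    for w_level in gates:                            # w_level: (num_nodes, dim)
        g = 1.0 / (1.0 + np.exp(-(w_level @ x)))     # P(go left) at each node
        # interleave left/right children: each node splits its mass
        probs = np.stack([probs * g, probs * (1.0 - g)], axis=1).ravel()
    return probs

rng = np.random.default_rng(1)
gates = [rng.normal(size=(2 ** level, 3)) for level in range(2)]  # depth-2 tree
r = leaf_responsibilities(rng.normal(size=3), gates)
# r has one entry per leaf (4 here) and sums to 1
```

Local experts would then be combined with these responsibilities as weights, which is what lets each expert specialize on its subspace of the features.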


Journal

Journal title: Neurocomputing

Year: 2021

ISSN: 0925-2312, 1872-8286

DOI: https://doi.org/10.1016/j.neucom.2020.08.052